Incentives for truthful reporting in crowdsourcing
Authors
Abstract
A challenge with programmatic access to human talent via crowdsourcing platforms is the specification of incentives and the checking of the quality of contributions. Methodologies for checking quality include providing a payment if the work is approved by the task owner and hiring additional workers to evaluate contributors’ work. Both of these approaches place a burden on people and on the organizations commissioning tasks, and may be susceptible to manipulation by workers and task owners. Moreover, neither a task owner nor the task market may know the task well enough to be able to evaluate worker reports. Methodologies for incentivizing workers without external quality checking include rewards based on agreement with a peer worker or with the final output of the system. These approaches are vulnerable to strategic manipulation by workers. Recent experiments on Mechanical Turk have demonstrated the negative influence of manipulation by workers and task owners on crowdsourcing systems [3]. We address this central challenge by introducing incentive mechanisms that promote truthful reporting in crowdsourcing and discourage manipulation by workers and task owners without introducing additional overhead. We focus on a large class of crowdsourcing tasks that we refer to as consensus tasks. Consensus tasks are aimed at determining a single correct answer or a set of correct answers to a question or challenge based on reports collected from workers. These tasks include numerous applications where multiple reports collected from people are used to make decisions. We adapt the peer prediction rule [4] to formulate a payment rule that incentivizes workers to contribute to...
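The peer prediction rule [4] mentioned above pays a worker by scoring, against a peer's actual report, the posterior distribution over that peer's report implied by the worker's own report. The sketch below illustrates the idea under a known common-prior model with binary signals; the signal names and probability values are illustrative assumptions, not taken from the paper.

```python
import math

def log_score(dist, outcome):
    # Strictly proper logarithmic scoring rule: reward is ln P(outcome).
    return math.log(dist[outcome])

def peer_prediction_payment(report_i, report_j, posterior):
    # Pay worker i by scoring the posterior over a peer's report,
    # conditioned on worker i's own report, against the peer's actual report.
    return log_score(posterior[report_i], report_j)

# Hypothetical common-prior model with binary signals ("good"/"bad"):
# posterior[r] is the distribution over a random peer's report, given that
# a worker truthfully reported r. All numbers are illustrative only.
posterior = {
    "good": {"good": 0.8, "bad": 0.2},
    "bad":  {"good": 0.3, "bad": 0.7},
}

# A worker who observed "good" and reports truthfully earns more in
# expectation than one who misreports, when the peer also reports "good":
pay_truthful = peer_prediction_payment("good", "good", posterior)  # ln 0.8
pay_lie = peer_prediction_payment("bad", "good", posterior)        # ln 0.3
```

Because the log score is strictly proper, truthful reporting maximizes a worker's expected payment whenever the posterior model is accurate; this is the property the paper builds on for consensus tasks.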
Similar Resources
Peer Truth Serum: Incentives for Crowdsourcing Measurements and Opinions
Modern decision making tools are based on statistical analysis of abundant data, which is often collected by querying multiple individuals. We consider data collection through crowdsourcing, where independent and self-interested agents, non-experts, report measurements, such as sensor readings, opinions, such as product reviews, or answers to human intelligence tasks. Since the accuracy of info...
Incentives and Truthful Reporting in Consensus-centric Crowdsourcing
We address the challenge in crowdsourcing systems of incentivizing people to contribute to the best of their abilities. We focus on the class of crowdsourcing tasks where contributions are provided in pursuit of a single correct answer. This class includes citizen science efforts that seek input from people in identifying events and states in the world. We introduce a new payment rule, called...
Learning the Prior in Minimal Peer Prediction
Many crowdsourcing applications rely on the truthful elicitation of information from workers; e.g., voting on the quality of an image label, or whether a website is inappropriate for an advertiser. Peer prediction provides a theoretical mechanism for eliciting truthful reports. However, its application depends on knowledge of a full probabilistic model: both a distribution on votes, and a poste...
Incentives for Subjective Evaluations with Private Beliefs
The modern web critically depends on aggregation of information from self-interested agents, for example opinion polls, product ratings, or crowdsourcing. We consider a setting where multiple objects (questions, products, tasks) are evaluated by a group of agents. We first construct a minimal peer prediction mechanism that elicits honest evaluations from a homogeneous population of agents with ...
Incentives for Truthful Evaluations
We consider crowdsourcing problems where the users are asked to provide evaluations for items; the user evaluations are then used directly, or aggregated into a consensus value. Lacking an incentive scheme, users have no motive to put effort into completing the evaluations, and may provide inaccurate answers instead. We propose incentive schemes that are truthful and cheap: truthful as the optimal u...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2012